Search Results: "Jonathan Dowland"

23 October 2023

Jonathan Dowland: cherished

minidisc player
iPod
Bose headphones
If I think back to technology I've used and really cherished, it's quite often audio-related: Minidisc players, Walkmans, MP3 players, headphones. These pieces of technology served as vessels to access music, to which I often have a fond emotional connection. I think the tech has benefited from that: the fondness for the music has somewhat transferred, or rubbed off, onto the technology used to access it. Put another way: no matter how well engineered it was, how easy it was to use or how well it did the job, I doubt I'd have fond memories, years later, of a toilet brush.

I wonder if the same "bleeding" of fondness applies to brands, too. If so, and if you were a large tech company, it would be worth having some audio gear in your portfolio. I think Sony must have benefited from this. Apple too.

on-ear phones
For listening on the go, I really like on-ear headphones, as opposed to over-ear. I have some lovely over-ear phones for listening at rest, but they get my head too hot when I'm active. On-ears are a nice compromise between the comfort and quality of over-ears and the portability of in-ears. Most of the ones I've owned have folded up nicely into a coat pocket, too. My current Bose pair are from 2019 and might be towards the end of their life. They replaced some AKG K451s, which were also discontinued. Last time I looked (2019) the Sony offerings in this product category were not great. That might have changed, but I fear the manufacturers have collectively decided this product category isn't worth investing in.

2 October 2023

Jonathan Dowland: Promotion

It's been quiet here (I hope to change that), but I want to share some good news: I've been promoted to Principal Software Engineer! Next February will start my 9th year with Red Hat. Time flies when you're having fun!

6 September 2023

Jonathan Dowland: VW Lupo mirror-adjustment knob

VW Lupo wing mirror adjustment knob
I designed and printed a replacement knob for the wing-mirror adjustment on a Volkswagen Lupo. The original had a "pineapple"-style texture on it. For my print, I sliced with PrusaSlicer and turned on "fuzzy skin" to get a texture on the grip side of the knob.

29 August 2023

Jonathan Dowland: Gazelle Twin

A couple of releases
I discovered Gazelle Twin last year via Stuart Maconie's Freak Zone (two of her tracks ended up on my 2022 Halloween playlist). Through her website I learned of The Horror Show! exhibition at Somerset House1 in London, which I managed to visit earlier this year. I've been intending to write a 5-track blog post (à la Underworld, the Cure, Coil) for a while, but I have been spurred on by the excellent news that she's got a new album on the way, and she's performing at the Sage Gateshead in November. Buy tickets now!! Here are the five tracks I recommend to get started:
  1. Anti-Body, from 2014's UNFLESH. I particularly love the percussion. Perc did a good hard-house-style remix on Fleshed Out, the companion remix album. Anti-Body by Gazelle Twin
  2. Fire Leap, from Gazelle Twin and NYX's collaborative album Deep England. The album is a re-interpretation of material from Gazelle Twin's earlier album, Pastoral, with the exception of this track, which is a cover of Paul Giovanni's song from The Wicker Man. There's a common aesthetic in all three works: eerie-folk, England's self-mythologising as seen through a warped and cracked lens. Fire Leap by Gazelle Twin There's a fantastic performance video of the whole album, directed by Iain Forsyth and Jane Pollard, available on YouTube.
  3. Better In My Day, from the aforementioned Pastoral. This track and this album are, I think, less accessible, more challenging than the re-interpreted material. That's not a bad thing: I'm still working on digesting it! This is one of the more abrasive, confrontational tracks. Better In My Day by Gazelle Twin
  4. I am Shell I am Bone, from way back: her first release, The Entire City, in 2011. Composed, recorded, self-produced and self-released. It's remarkable to me how different each phase of GT's work is from the others. This album evokes a strong sense of atmosphere and place for me. There's a hint of possible influence from Joy Division's Unknown Pleasures (or New Order's Movement) in places. The b-side to this song is a cover of Joy Division's The Eternal. I Am Shell I Am Bone by Gazelle Twin
  5. GT re-issued The Entire City in 2022 along with a companion-piece EP of newly-released material from the same era, The Wastelands. This isn't cutting-room-floor stuff, though, as evidenced by the strength of my final pick, Hole in my Heart. Hole In My Heart by Gazelle Twin
It's hard to pick just five tracks when doing these (that's the point, I suppose). I note that I haven't picked anything from her wide-ranging soundtrack work: her last three or four releases have been soundtracks, released on the well-respected UK label Invada, as will be her forthcoming album. You can find all the released stuff on Gazelle Twin's Bandcamp page.

  1. I recently learned that PJ Harvey once hosted album sessions in Somerset House, and allowed fans to watch her at work during the recording: https://www.theguardian.com/music/2015/jan/16/pj-harvey-somerset-house-recording-in-progress

21 August 2023

Jonathan Dowland: FreshRSS

Now that it's more convenient for me to run containers at home, I thought I'd write a bit about web apps I am enjoying. First up, FreshRSS, a web feed aggregator.

I used to make heavy use of Google Reader until Google killed it, and although a bunch of self-hosted clones sprang up very quickly afterwards, I didn't transition to any of them. There followed a number of years in which, in retrospect, I didn't do a great job of organising my reading of the web. This lasted until a couple of years ago when, on a whim, I tried out NetNewsWire for iOS. NetNewsWire is a well-established and much-loved feed reader for Mac which I've never used. I used the iOS version in isolation for a long time: only dipping into web feeds on my phone and never on another device.

I'd like to see the old web back, and to do my part to make that happen. I've continually published RSS and Atom feeds for my own blog. I'm also trying to blog more: this might be easier now that Twitter (which, IMHO, took a lot of the energy for quick writing out of blogging) is mortally wounded.

So, I'm giving FreshRSS a go. Early signs are good: it's fast, lightweight, easy to use, vaguely resembles how I remember Google Reader, and has a native dark mode. It's still early days building up a new list of feeds to follow. I'll be sure to share interesting ones as I discover them!

13 August 2023

Jonathan Dowland: Terrain base for 3D castle

terrain base for the castle
I designed and printed a "terrain" base for my 3D castle in OpenSCAD. The castle was the first thing I designed and printed on our (then new) office 3D printer. I use it as a test bed if I want to try something new, and this time I wanted to try procedurally generating a model. I've released the OpenSCAD source for the terrain generator under the name Zarchscape. mid 90s terrain generation
Lots of mid-90s games had very boxy floors
Terrain generation, 90s-style. From [this article](https://web.archive.org/web/19990822085321/http://www.gamedesign.net/tutorials/pavlock/cool-ass-terrain/)
Back in the 90s I spent some time designing maps/levels/arenas for Quake and its sibling games (like Half-Life), mostly in the tool Worldcraft. A lot of beginner maps (including my own) ended up looking pretty boxy. I once stumbled across a blog post that taught me a useful trick for making more natural-looking terrain. In brief: tessellate the floor region with triangle polygons, then randomly add some jitter to the z-dimension of their vertices. A really simple technique with fairly dramatic results.

OpenSCAD
Doing the same in OpenSCAD stretched me, and I think stretched OpenSCAD. It left me with some opinions which I'll try to write up in a future blog post.

Final results
multicolour
I've generated and printed the result a couple of times, including an attempt at a multicolour print. At home, I have a large spool of brown-coloured recycled PLA, and many small sample lengths in various colours (that I picked up at Maker Faire Czech Republic last year), including some short lengths of green.

My home printer is a Prusa Mini, and I cheaped out and didn't buy the filament runout sensor, which would detect when the current filament ran out and let me handle the situation gracefully. Instead, I added several colour-change instructions to the g-code at various heights, hoping that whatever plastic I loaded for each layer was enough to get the print to the next colour-change instruction.

The results are a little mixed, I think. I didn't catch the final layer running out in time (forgetting that the Bowden tube means I need to catch it running out before the loading gear, a few inches earlier than the nozzle), so the final lush green colour ends prematurely. I've also got a fair bit of stringing to clean up. Finally, all these non-flat planes really show up some of the limitations of regular slicing; it would be interesting to try this with a non-planar slicer.
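The jitter trick described earlier translates to very little code. Here's an illustrative Python sketch (not the actual Zarchscape code; the function and parameter names are mine):

```python
import random

def jittered_terrain(nx, ny, cell=10.0, jitter=3.0, seed=42):
    """Tessellate a rectangular floor into triangles, then randomly
    perturb each vertex's z coordinate to fake natural terrain."""
    rng = random.Random(seed)
    # one jittered height per grid vertex, shared by adjacent triangles
    # so the surface stays watertight
    z = [[rng.uniform(-jitter, jitter) for _ in range(nx + 1)]
         for _ in range(ny + 1)]
    triangles = []
    for j in range(ny):
        for i in range(nx):
            # the four corner vertices of this grid cell
            v = lambda a, b: (a * cell, b * cell, z[b][a])
            p00, p10 = v(i, j), v(i + 1, j)
            p01, p11 = v(i, j + 1), v(i + 1, j + 1)
            # split the quad into two triangles
            triangles.append((p00, p10, p11))
            triangles.append((p00, p11, p01))
    return triangles

tris = jittered_terrain(4, 3)
print(len(tris))  # 4 * 3 cells, two triangles each -> 24
```

The same list of triangles can be fed to a `polyhedron()` in OpenSCAD or written out as STL; the fixed seed keeps the terrain reproducible between runs.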

1 August 2023

Jonathan Dowland: Interzone's new home

IZ #294, the latest issue
The long-running British1 SF magazine Interzone has a new home and a new editor, Gareth Jelley, starting with issue 294. It's also got a swanky new format ("JB6"): a perfect-bound, paperback novel size, perfect for fitting into an oversize coat or jeans pocket for reading on the train. I started reading Interzone in around 2003, having picked up an issue (#176) from Feb 2002 that was languishing on the shelves in Forbidden Planet. Once I discovered it, I wondered why it had taken me so long. That issue introduced me to Greg Egan. I bought a number of back issues on eBay, to grab issues with stories by people including Terry Pratchett, Iain Banks, Alastair Reynolds, and others.
IZ #194: The first by TTA press
A short while later, in early 2004 and after 22 years, Interzone's ownership and editorship changed from David Pringle to Andy Cox and TTA Press. I remember the initial transition was very jarring: the cover emphasised expanding into coverage of manga, graphic novels and video games (which ultimately didn't happen), but after a short period of experimentation it quickly settled down into a similarly fantastic read. I particularly liked the move to a smaller, perfect-bound form-factor in 2012. I had to double-check this, but I'd been reading IZ throughout the TTA era, and it lasted 18 years! Throughout that time I discovered countless fantastic authors that I would otherwise never have experienced. Some (but by no means all) are Dominic Green, Daniel Kaysen, Chris Beckett, Cécile Cristofari, Aliya Whiteley, Tim Major, Françoise Harvey and Will McIntosh.

Cox has now retired (after 100 issues and a tenure almost as long as Pringle's) and handed the reins to Gareth Jelley/MYY Press, who have published their first issue, #294. Jelley is clearly putting a huge amount of effort into revitalising the magazine. There's a new homepage at interzone.press, but also companion internet presences: a plethora of digital content at interzone.digital, Interzone Socials (a novel idea), a Discord server, a podcast, and no doubt more.

Having said that, the economics of small magazines have been perilous for a long time, and that hasn't changed, so I think the future of IZ (in physical format at least) is in peril. If you enjoy short fiction, fresh ideas, SF/F/Fantastika, why not try a subscription to Interzone, whilst you still can!

  1. Interzone has always been "British", in some sense, but never exclusively so. I recall fondly a long-term project under Pringle to publish a lot of the Serbian writer Zoran Živković, for example, and the very first story I read was by the Australian Greg Egan. Under Jelley, the magazine is being printed in Poland and priced in Euros. I expect it to continue to attract and publish writers from all over the place.

18 July 2023

Jonathan Dowland: Bea's 3D printer

My daughter Beatrice asked me to print her a 3D printer.
  Bea's 3D printer
Most of the model is from https://www.printables.com/model/355917-miniature-3d-printer-model-toy, and the unicorn is from https://www.printables.com/model/385926-unicorn.

15 June 2023

Jonathan Dowland: containers as first-class network citizens

I've moved to having containers be first-class citizens on my home network, so any local machine (laptop, phone, tablet) can communicate directly with them all, but they're not (by default) exposed to the wider Internet. Here's why, and how.

After I moved containers from Docker to Podman and systemd, it became much more convenient to run web apps on my home server, but the default approach to networking (each container gets an address on a private network between the host server and containers) meant tedious work (maintaining and reconfiguring an HTTP reverse proxy) to make them reachable by other devices. A more attractive arrangement would be for each container to receive an IP from the range used by my home LAN, and to be automatically addressable from any device on it.

To make the containers first-class citizens on my home LAN, first I needed to configure a Linux network bridge and attach the host machine's interface to it (I've done that many times before); then define a new Podman network of type "bridge". podman-network-create(1) serves as a reference, but the blog post Exposing Podman containers fully on the network is an easier read (skip past the macvlan bit).

I've opted to choose IP addresses for each container by hand. The Podman network is narrowly defined to a range of IPs that are within the subnet that my ISP-provided router uses, but outside the range of IPs that it allocates. When I start up a container by hand for the first time, I choose a free IP from the sub-range and add a line to /etc/avahi/hosts on the parent machine, e.g.
192.168.1.33 octoprint.local
I then start the container specifying that address, e.g.
podman run --rm -d --name octoprint \
        ...
        --network bridge_local --ip 192.168.1.33 \
        octoprint/octoprint
I can now access that container from any device in my house (laptop, phone, tablet...) via octoprint.local.

What's next
Although it's not a huge burden, it would be nice not to need to statically define the addresses in /etc/avahi/hosts (perhaps via "IPAM"). I've also been looking at WireGuard (which should be the subject of a future blog post), and combining this with that would be worthwhile.
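For reference, creating such a Podman network might look something like the following sketch. The network name bridge_local matches the run command above, but the subnet, gateway and IP range are illustrative assumptions about the LAN layout, not values taken from the post:

```shell
# define a Podman network of type "bridge", constrained to a slice of
# the home LAN's subnet that the router's DHCP server won't allocate from
podman network create \
    --driver bridge \
    --subnet 192.168.1.0/24 \
    --gateway 192.168.1.1 \
    --ip-range 192.168.1.32/28 \
    bridge_local
```

Tying the network to the host bridge configured earlier depends on your Podman version and network backend; podman-network-create(1) documents the available options.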

23 May 2023

Jonathan Dowland: neovim plugins and distributions

I've been watching the neovim community for a while, and what seems like a Cambrian explosion of plugins emerging. A few weeks back I decided to spend most of a "day of learning" investigating some of the plugins and technologies that I'd read about: Language Server Protocol, TreeSitter, neorg (a grandiose organiser plugin), etc.

It didn't go so well. I spent most of my time fighting version incompatibilities or tracing through scant documentation or code to figure out which plugin was incompatible with which other. There's definitely a line where crossing it means spending too much time playing with your tools instead of creating. On the other hand, there's definitely value in honing your tools and learning about new technologies. Everyone's line is probably in a different place. I've come to the conclusion that I don't have the time or inclination (or both) to approach exploring the neovim universe in this way. There exist a number of plugin "distributions" (such as LunarVim): collections of pre-configured and integrated plugins that you can use out-of-the-box. Next time I think I'll pick one up and give that a try, independently from my existing configuration, and see which ideas from it I might like to adopt.

shared vimrcs
Some folks upload their vim or neovim configurations in their entirety for others to see. I noticed Jess Frazelle had published hers, so I took a look. I suppose one could evaluate a bunch of plugins and configuration in isolation using a shared vimrc like this, in the same way as a distribution.

bufferline
Amongst the plugins she uses was bufferline, a plugin to re-work neovim's tab bar to behave like the tab bars of more conventional editors1. I don't make use of neovim's tabs at all2, so I would lose nothing by having the (presently hidden) tab bar reworked, so I thought I'd give it a go. I had to disable an existing plugin, lightline, which I've had enabled for years but wasn't sure I was getting much value from.
Apparently it also messes with the tab bar! Disabling it, at least for now, means I'll find out whether I miss it. I am already using vim-buffergator as a means of seeing and managing open buffers: a hotkey opens a sidebar with a list of open buffers, to switch between or close. Bufferline gives me a more immediate, always-present view of open buffers, which is faintly useful, but not much. Perhaps I'd like it more if I was coming from an editor that had made it more of an expected feature. Two things I noticed make it less useful for me: when browsing around vimwiki pages, I quickly open a lot of buffers, and the horizontal line fills up very quickly. Even when I don't, I habitually have quite a lot of buffers open, and the horizontal line is quickly overwhelmed. I have also found myself closing open buffers with the mouse, which I didn't do before.

vert
Since I have brought up a neovim UI feature (tabs), I thought I'd briefly mention my new favourite neovim built-in command: vert. Quite a few plugins and commands open up a new window (e.g. git-fugitive, Man, etc.), and they typically do so in a horizontal split. I'm increasingly preferring vertical splits. Prefixing any3 such command with vert forces the split to be vertical instead.
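To illustrate with neovim built-ins (these particular commands are my examples, not from the post):

```vim
:Man 3 printf        " opens the printf(3) man page in a horizontal split
:vert Man 3 printf   " the same, but in a vertical split
:vert help windows   " works for :help, too
```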

  1. in this case the direct influence was apparently DOOM Emacs
  2. (neo)vim's notion of tabs is completely different to what you might expect from other UI models.
  3. at least, I haven't found one that doesn't work yet

5 May 2023

Jonathan Dowland: sidebar dividers for mutt

I wanted to start using (neo)mutt's sidebar and I wanted a way of separating groups of mail folders in the list. To achieve that I interleaved a couple of fake "divider" folder names. It looks like this:
  Screenshot of neomutt with sidebar
This was spurred on by an attempt to revamp my personal organisation. I've been using mutt for at least 20 years (these days neomutt), which, by default, does not show you a list of mail folders all the time. The default view is an index of your default mailbox, from which you can view a mail (pager view), switch to a mailbox, or do a bunch of other things, some of which involve showing a list of mailboxes. But the list is not omnipresent. That's somewhat of a feature, if you believe that you don't need to see that list until you are actually planning to pick from it.

There's an old and widespread "sidebar" patch for mutt (which neomutt ships out of the box). It reserves a portion of the left-hand side of the terminal to render a list of mailboxes. It felt superfluous to me so I never really thought to use it, until now: I wanted to make my Inbox functional again, and to achieve that, I needed to move mail out of it which was serving as a placeholder for a particular Action, or as a reminder that I was Waiting on a response. What was stopping me was a feeling I'd forget to check other mailboxes. So, I need to have them up in my face all the time to remind me.

Key for me, to make it useful, is to control the ordering of mailboxes and to divide them up using the interleaved fake mailboxes. The key configuration is therefore
set sidebar_sort_method = 'unsorted'
mailboxes =INBOX =Action =Waiting
mailboxes '=   ~~~~~~~~' # divider
...
My groupings, for what it's worth, are: the key functional mailboxes (INBOX/Action/Waiting) come first; last, is reference ('2023' is the name of my current Archive folder; the other folders listed are project-specific reference and the two mailing lists I still directly subscribe to). Sandwiched in between is currently a single mailbox which is for a particular project for which it makes sense to have a separate mailbox. Once that's gone, so will that middle section. For my work mail I do something similar, but the groupings are
  1. INBOX/Action/Waiting
  2. Reference (Sent Mail, Starred Mail)
  3. More reference (internal mailing lists I need to closely monitor)
  4. Even more reference (less important mailing lists)
As with everything, these approaches are under constant review.

24 April 2023

Jonathan Dowland: Separate hledgers

In a previous blog post I described the use of virtual postings to track accidental personal/family expenses. I've always been uncomfortable with that, and in hledger 1yr I outlined a potential scheme for finally addressing the virtual posting problem.

separate journals
My outline built on top of continuing to maintain both personal and family financial data in the same place, but I've decided that this can't work, because the different "directions" (or signs) of accidental transactions originating from either the family or personal side can't be addressed with any kind of alternate view on the same data.

To illustrate with an example: a negative balance in family:liabilities:jon means "family owes jon". A coffee bought by mistake on the family credit card will have a negative posting on the credit card, and thus a positive one on the liabilities account ("jon owes family"). That's fine. But what about when I buy family stuff on a personal card? The other side of the transaction is also going to have a positive sign, so it can't be posted to family:liabilities:jon: it would have to go somewhere else, like jon:liabilities:family. Now I have two accounts which track versions of the same thing, and they cannot be combined with a simple transaction, since they're looking at the same value from opposite directions (and signs).

Back when I first described the problem I was using a single journal file for all my transactions. After moving to lots of separate journal files (in hledger 1yr), it's become clearer to me that I don't need to maintain the Family and Personal data together at all: they can be entirely separate journals.

getting data between journals
When I moved to a new set of ledger files for 2023, I needed to carry forward the balances from 2022 in the form of "opening balance" transactions.
This was achieved by a report on the 2022 data, exported as CSV and imported into the 2023 data (all following the scheme outlined by fully-fledged hledger). Separate Personal and Family journals need some information from each other, and I can achieve that in the same way as for opening balances: with an export of the relevant transactions as CSV, then an import on the other side. HLedger's CSV import system is flexible enough that we can effectively invert the sign of liabilities, addressing the problem above.

Worked example
We start with an accidental coffee purchased on the family card (and so this belongs to the Family ledger):
2022-08-20 coffee
    liabilities:creditcard             -3
    liabilities:jon:expenses:coffee     3
I've encoded the expense category that the Personal ledger will be interested in (the last bit, expenses:coffee) as a sub-account of the liabilities category that the Family ledger is interested in1 (the first bit, liabilities:jon). When viewed on the Family side, the expense category is not interesting, and we can hide it with HLedger's alias feature2:
    alias /^liabilities:jon(.*)$/ = liabilities:jon
It then looks like this from the Family side:
2022-08-20 coffee
    liabilities:creditcard             -3
    liabilities:jon                     3
This transaction (and others like it) are exported via
hledger reg -f family/2023-back.journal liabilities:jon: -O csv \
        -o jon/import/family/liabilities.csv
(The trailing colon on liabilities:jon: is important here!) In the resulting CSV file, the running example transaction looks like
"55","2022-08-20","","coffee","liabilities:jon:expenses:coffee","  3.00","  3.00"
This is then converted into a journal file by hledger import. The rules file for the import is very simple: the fields date, description, account1 and amount are taken as-is; account2 is hard-coded to liabilities:family. The resulting transaction looks like
2022-08-20 coffee
    liabilities:jon:expenses:coffee     3
    liabilities:family                 -3
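Such a rules file might look like the following sketch (the names I've given the unused columns, txnidx, code and total, are my own guesses from the seven-column CSV above; only date, description, account1 and amount are used, and account2 is hard-coded):

```
# map the CSV columns produced by "hledger reg -O csv"
fields txnidx, date, code, description, account1, amount, total

# the other side of every imported transaction
account2 liabilities:family
```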
Before this journal is included by the main one, we have to adjust the expense account, to remove the liabilities:jon: prefix. The import rules can't do this3, so we use another journal file as a go-between, with another alias rule:
    alias /^liabilities.jon:/ =
This results, finally, in the following transaction in the Personal ledger:
2022-08-20 coffee
    expenses:coffee                     3
    liabilities:family                 -3
avoiding double-counting
There's one set of transactions that we don't want to export across this divide, because they're already there: any time I transfer money from myself to the family accounts (or vice versa) to address the accrued debt, the transaction is visible in both my family and personal statements. To avoid exporting these and double-counting them, I make sure those transactions don't post to an account matching the pattern used in the hledger reg report. That's what the trailing colon is for: it ensures I only export transactions which are to a sub-account of liabilities:jon, and not to the root account liabilities:jon itself, which is where I put the repayment transactions. I could instead use a more explicit sub-account like liabilities:jon:repayments or similar, since the trailing colon is quite subtle, but this works for me.

Wrap up
I've been really on the fence as to whether the complexity of this scheme is worth it to avoid the virtual postings. The previous scheme was much simpler. I have definitely made some mistakes with it, which didn't get caught by the double-entry rules that virtual postings ignore, but they were for small sums of money anyway. On the other hand, a lot of the "machinery" of this already existed for getting opening balances between calendar years, and the gory details are written down and hidden inside the Makefile. I also expect that I will continue to see advantages in having Family and Personal entirely separate, as they can each develop and adapt to their own needs without having to consider the other side of things every time. It's a running experiment, and time will tell if it's a good idea.

  1. This scheme was originally suggested to me by Pranesh on Twitter (described in dues), but I discounted it at the time because of the exact arrangement they suggested, not realising the broader idea might work.
  2. I've hand-waved one problem with using hledger aliases here. If we use them as described, to hide the Personal expense details, we need them not to be applied when performing the CSV-generating report. Therefore, in practice I have them in a front-most family/2023.journal file, which imports the data from another family/2023-back.journal; the CSV export is performed on the backing journal with the data and not the alias.
  3. HLedger import rules can't manipulate the fields from the CSV a great deal, but one change I proposed and started hacking on would allow for this: to expose Regexp match-groups as interpolatable tokens: https://github.com/simonmichael/hledger/issues/2009.

12 April 2023

Jonathan Dowland: blog after death

I've been pondering what should happen to personal websites once the owner has passed, or otherwise "moved on". Some time ago I stumbled across a blog by Kev Quirk, who wrote
I'd need to come up with contingency plans for... This website, and any other websites I own/manage
I thought it was a strange idea, to have a contingency plan for a personal site to survive its writer. Even for a site such as my own, which (as sites go) would be trivial to host or mirror (since it's static): who would want to do that? Why would they do that?

On the other hand, whether something I've written is useful or not is largely independent of whether I'm alive and well. And if anything I've written is useful independently of me (to pick a recent example, my notes on imaging optical media), should it persist somewhere?

I've long thought of personal sites much like any website, which is to say, implicitly eternal, despite mountains of evidence that practically the reverse is true. I was influenced a long time ago by the article Cool URIs don't change. Thinking more about "digital legacy" led me to start believing that personal sites are ultimately not the right place for any kind of content that should have some persistence. (You might well have started with that assumption!)

Yesterday via Planet Debian I read Aurélien Jarno's recent blog post, in which they have deprecated all of their personal site bar their blog. Quoting Aurélien:
Wikipedia is a much better platform for sharing knowledge than random websites.
This struck me as an interesting idea. Wikipedia is a reasonable place for a lot of material that might otherwise be hosted on personal sites, and, as a "living website", content there can be adapted, corrected, etc. over time. Wikipedia is clearly not the right place for much other material that exists on personal sites (it's not the right place for my aforementioned article), and I'm not sure where would be. Perhaps we, as a culture, need to move away from the notion of personal sites and devise a more collective concept. Or perhaps it's no bad thing that, without intervention, this stuff disappears.

28 March 2023

Jonathan Dowland: daily log

I was asked on the Fediverse1 to describe my $dayjob daily workflow. Blogging about it seemed like a good opportunity to take stock of my current setup.

The core of my approach is to maintain a daily log, rather like an engineering lab book2. My number one tip for anyone starting a career in software engineering (or, for that matter, systems administration) is to keep a daily log of what you intended to do, what you actually did, what you changed, what changed as a consequence, etcetera. I've already written about the tools I use for that: vimwiki. The key drawback of that system is managing TODO lists. In Vimwiki, tasks look like this:
  * [ ] buy milk (still todo)
  * [X] clean the car (done)
These get dispersed or duplicated around different days/wiki pages. It's hard to get a current view of open (or otherwise relevant) tasks, and impractical to update many copies of the same task when its state changes (such as when it's done). The solution I've adopted for now is another Vim plugin, taskwiki, which synchronises tasks with Taskwarrior3, an external task-management tool. If I mark a task as "done", Taskwiki updates all references to that task to reflect the new state. I can also construct queries to list all tasks matching some criteria. I have a special Vimwiki page named "Backlog" which runs the query "all tasks tagged 'redhat' in state 'pending'" (Linked from the boilerplate at the top of every page I write, for quick access):
= Backlog   +redhat status:pending =
* [ ] buy milk (still todo)
...
Much like the base Vimwiki plugin, Taskwiki is very opinionated, and I've had to tame it by disabling several of its features. I've also hit a couple of mildly frustrating bugs (#368, #425). I might one day have a go at writing an alternative, simpler plugin in Lua (Neovim's native scripting language), but for now it works well enough and I don't have the time.

There's very little in this current workflow for scheduling tasks, and that's probably where the focus should be for my next iterative improvement efforts. I think Taskwarrior, the underlying tool, has some good support for that. I'd particularly like some more visual approaches for managing the backlog, such as something Kanban-style.

  1. sorry I can't find who/the toot any more
  2. Analogous to an engineering lab book, but the analogy is not exact. I recently read a little bit about what Engineers are actually taught to do in a lab book, and it's quite a bit more narrowly-scoped than I realised. This slide-deck is interesting: https://www.training.nih.gov/assets/Lab_Notebook_508_(new).pdf
  3. I rarely use Taskwarrior directly yet, and so I haven't written about it independently of Taskwiki.

27 March 2023

Jonathan Dowland: Imaging Optical Media, Part 3: Figuring out disc contents

Too many years ago, I started (but did not finish) a series of blog posts on the topic of imaging optical media. I was writing it as I was figuring out the process to use whilst importing my own piles of home-made CD-Rs and DVD-Rs to more suitable storage.

Back in 2018 Antoine Beaupré blogged about being inspired by my article series to sort out his optical media collection, and wrote up his notes so far. This inspired me (in 2018!) to change the approach for writing up what I'd been doing. Instead of a series of blog posts, I've dumped all the material I have written in one place: imaging discs. I've also reduced the scope somewhat, deciding that "Organising the extracted files" is a large enough (and orthogonal) topic to deserve its own page (if I ever write anything about it).

The "new" material is really two unfinished blog posts: ?figuring out disc contents and ?Retrying damaged/degraded discs. I post them now in the hope that they are useful despite being unfinished.

18 March 2023

Jonathan Dowland: Qi charger stand

I've got a Qi-charging phone cradle at home which holds the phone up at an angle that works with Apple's Face ID. At work, I've got a simpler "puck"-shaped one which is less convenient, so I designed a basic cradle to raise both the charger and the phone up.
cradle without phone
I did two iterations, and the second was "good enough" that I stopped there, although I would make some further alterations if I were to print it again: more of a cut-out for the USB-C cable; a raised plinth for the Qi charger, so that USB-C cables with long collars have enough room; and a longer base to compensate for the changed weight distribution.
cradle with phone

7 March 2023

Jonathan Dowland: Welcome Oblivion 10th Anniversary

I haven't done one of these for a while, and they'll be less frequent than I once planned as I'm working from home less and less. I'm also trying to get back into exploring my digital music collection, and more generally engaging with digital music again.
 picture of a vinyl record
It's the ten-year anniversary of the first (and last) LP by How To Destroy Angels (HTDA), the side-project of Trent Reznor with his wife, his Nine Inch Nails (NIN) partner-in-crime Atticus Ross, and visual artist (and NIN artistic director) Rob Sheridan.

This album was a real pleasure. For NIN fans, it wasn't clear what the future held after the start of HTDA. But this work really stood alone: similar in some ways to NIN, but sufficiently different to be fresh and exciting. In stark contrast to NIN (at the time), it was interesting to see the members of HTDA presented on an equal footing, especially Rob Sheridan, who wasn't a musician. The intent was to try and put the visual work on the same level of esteem as the musical. HTDA performed a few live shows, but none outside the US. They were apparently quite a spectacle.

As an artefact, this is a gorgeous LP. The gatefold cover and all four sides of the two record sleeves are covered in unique pieces of Sheridan's glitch art. When I originally bought this I had a rather generously-sized individual office at the University, so I framed and displayed many of these pieces on my office walls. Sheridan has since written extensively on the processes and techniques he used for this style of art, and has produced many more works using the same techniques. You can see some on his website, Patreon, fine-art print shop or Threadless store. Late last year I treated myself to a large print of some related work, analog(Oblivion)000b, which (once the framing is done) I'm going to hang in my home office.

The LP had two tracks that were not present in the CD or digital release versions of the album, although a CD which included the tracks was bundled with the LP. (The Knife did something similar with Shaking the Habitual, at around the same time.)
I've had some multitrack stems from this album sitting in my "for archive.org" folder for a while, so I took the opportunity of the 10th anniversary to upload them, here: https://archive.org/details/htda_multitracks

Jonathan Dowland: date warping in HLedger

My credit card and bank account rarely agree on the date for when I pay it off1. Since I added balance assertions for bank account transactions, I need the transaction in my ledger to match what the bank thinks, otherwise the balance assertions would start to fail. The skew is not normally more than a couple of days, and could be papered over by changing the date of just one of the two postings. But since the skew itself is not very important, the posting date can be put to a more useful purpose.

date warping credit card repayments

My credit card bills land halfway through the month, so February's bill covers transactions between January 15th and February 14th. I pay off the bill in full each month by Direct Debit. The credit card company consider the bill paid immediately, but they don't actually draw the money until the end of the month (January 31st in the running example). This means the payment transaction for a given month lands halfway through the period covered by the next month's bill.

The credit card bill itself shows the payment date at the end of the month, but presents the transaction "warped" right to the start. This is actually useful, because it means the balance is zero for the first purchase on the bill. The credit card data in CSV form, however, has the repayment transaction at the date it occurred, not warped to the start of the period. When I import this into HLedger, the credit card account balance for each new transaction does not match the statement right up to the point of the repayment, halfway through. This makes spot-checking that the imported data matches the statement a bit more awkward. So, I have started "warping" the payment transaction to the start of the billing period, just like the credit card statement does:
2022-12-31  pay credit card
  asset:bank      -500
  liabilities:credit card  ;  date:2022-12-15
I can then spot-check the transactions in HLedger after import, in particular the final one and the final account balance, and then write a manual balance assertion when I'm finished. I'd quite like to automate the adjusted posting date too, but I haven't figured out how to do that just yet.

date warping for refunds

Another thing I've found "date warping" useful for is marrying up refunds with their related purchase. Imagine I spent 200 on some shoes in late January, but returned most of them in early February:
2023-01-25  buy some shoes. hedging on the size
  liabilities:credit card      -200
  expenses:shoes
2023-02-05  return the ones that don't fit
  liabilities:credit card       150
  expenses:shoes
If I look at how much I've spent on shoes per month, it looks odd: 200 in January (although ultimately I only spent 50), and -150 in February.
$ hledger bal -Mt expenses:shoes
Balance changes in 2023-01-01..2023-02-28:
                    Jan     Feb 
================++===============
 expenses:shoes     200    -150 
----------------++---------------
                     200    -150 
By "warping" the refund's posting to the expense account to the purchase date, how much I ultimately spent on shoes is more properly reflected:
2023-02-05  return the ones that don't fit
  liabilities:credit card       150
  expenses:shoes  ; date:2023-01-25
resulting in
$ hledger bal -Mt expenses:shoes
Balance changes in 2023-01-01..2023-02-28:
                   Jan  Feb 
================++===========
 expenses:shoes     50    0 
----------------++-----------
                    50    0 
I suppose whether you'd want to do this is a matter of taste.
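An aside on mechanism: hledger can also express this kind of adjustment with a secondary date, written date1=date2 on the transaction's first line, which reports honour when run with the --date2 flag:

2023-02-05=2023-01-25  return the ones that don't fit
  liabilities:credit card       150
  expenses:shoes

But a secondary date applies to the whole transaction, whereas the posting-level date: tag above moves only the posting it's attached to, which is what I want here.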

  1. Amazon rarely agrees with my bank on when we've paid for things either. For other reasons, Amazon is a beast to tackle in another blog post.

13 February 2023

Jonathan Dowland: A visit to Prusa Labs


In September I was in Czechia for a Red Hat event. I ended up travelling via Prague, and had an unexpected extra day due to an airline strike causing my flight home to be cancelled. I took the opportunity to visit Prusa's offices/factory/Lab, and it was amazing! The Prusa team were all busy getting ready for the Prague Maker Faire that was happening the day afterwards.1

On arriving at the street which houses Prusa's Lab and Office buildings, the first thing that hit me was the smell. I find the melted-plastic smell of FDM printing (with PLA, at least) quite pleasant, and this was a super-condensed version of that, pumping out of their ground-floor windows. I started at the reception area on the ground floor. Outside reception there's a lovely sculpture representing the history of the development of the MK3S+.
The Reception
SLS Farm in the former Hack lab
History of the MK3S+

At reception there's a small waiting area with shelves of demonstration prints and some spools of Prusament. From here, our kind guide first took us to a region of the ground floor that used to be (prior to COVID times) the public maker/hack lab. The lab contained two modest farms of printers: one of their flagship FDM printer, the MK3S+, and another of their SLS resin printers. A close-up of some example resin models is pictured above. The bicycle was tiny: about the size of a thumbnail.
A historic display
Bespoke QE equipment
The MK3S+ Farm

Moi in the Farm
2KG orange PLA spools for the Farm
The rest of the ground floor area was full of heavy machinery and prototyping equipment. Onwards we went to the upstairs floors. Upstairs, past a nice graphic of Prusa's historic products, we visited the assembly and QE testing areas. They have a very organised system of parts buckets and some thorough QE processes, including some bespoke equipment that produces the "receipt" of tests and calibration that they provide for you in the box when you buy a 3D printer.

After that, we visited the production Farm: a large room full of MK3S+ printers churning out parts for other printers. The noise was remarkable. The printers were running custom firmware to continually print the part they had been set for. Some of the printers were designated for printing with ASA: they were colour-coded (yellow controller surround) and enclosed within boxed regions to prevent the fumes causing problems (pictured at the top). Outside the room sat pallets of orange PETG plastic for the printer farm, on 2KG spools (not a size they sell to the public just yet).

The final part of the trip was outside, to the real farm: Prusa have a smallholding with Alpacas at the rear of their estate. Whilst we visited it, Josef Prusa himself turned up (in a snazzy-looking custom-colour Tesla) to feed the animals, say hello and pose for a picture.

Overall, it was a fantastic visit. I'm very grateful to Air France for cancelling my flight home, and to Prusa Labs (in particular Luk ) for allowing me to come and say hello!

  1. I managed to squeeze that in too on my way to my rescheduled flight, although it was a rush visit and I don't have much to say or show from it. Suffice to say that it was lovely and bittersweet since the UK Maker Faire used to be hosted in my fair home city of Newcastle before they stopped.

10 February 2023

Jonathan Dowland: HLedger, 1 year on

It's been a year since I started exploring HLedger, and I'm still going. The rollover to 2023 was an opportunity to revisit my approach. Some time ago I stumbled across Dmitry Astapov's HLedger notes (full-fledged hledger, which I briefly mentioned in eventual consistency) and decided to adopt some of its ideas.

new year, new journal

First up, Astapov encourages starting a new journal file for each calendar year. I do this for other, accounting-adjacent files as a matter of course, and I did it for my GNUCash files prior to adopting HLedger. But the reason for those is a general suspicion that a simple mistake with that software could irrevocably corrupt my data. I'm much more confident in HLedger, so rolling over at year's end isn't necessary for that reason. But there are other advantages. An obvious one is that you can get rid of old accounts (such as expense accounts tied to a particular project, now completed).

one journal per import

In the first year, I periodically imported account data via CSV exports of transactions and HLedger's (excellent) CSV import system. I imported all the transactions, once each, into a single, large journal file. Astapov instead advocates creating a separate journal for each CSV that you wish to import, and keeping the CSV around, leaving you with a 1:1 mapping of CSV:journal. Then use HLedger's "include" mechanism to pull them all into the main journal.

With the former approach, where the CSV data was imported precisely once, it was only exposed to your import rules once. The workflow ended up being: import transactions; notice some that you could have matched with import rules and auto-coded; write the rule for the next time. With Astapov's approach, you can re-generate the journal from the CSV at any point in the future with an updated set of import rules.

tracking dependencies

Now we get onto the job of driving the generation of all these derivative journal files.
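To recap the structure being generated: the top-level journal simply pulls in the per-import journals with include directives, something like this (filenames invented for illustration):

include import/opening/2023.journal
include import/jon/amex/2023-01.journal
include import/jon/bank/2023-01.journal

Each of those included journals is a generated artefact, derived from a CSV and a rules file.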
Astapov has built a sophisticated system using Haskell's "Shake", with which I'm not yet familiar, but for my sins I'm quite adept at (GNU-flavoured) UNIX Make, so I started building with that. An example rule:
import/jon/amex/%.journal: import/jon/amex/%.csv rules/amex.csv.rules
    rm -f $(@D)/.latest.$*.csv $@
    hledger import --rules-file rules/amex.csv.rules -f $@ $<
This captures the journal's dependency not just on the underlying CSV but also on the relevant rules file; if I modify that, and this target is run in the future, all dependent journals should be re-generated.1

opening balances

It's all fine and well starting over in a new year, and I might be generous enough to forgive debts, but I can't count on others to do the same. We need to carry over some balance information from one year to the next. Astapov has a more complex (or perhaps more featureful) scheme for this involving a custom Haskell program, but I bodged something with a pair of make targets:
import/opening/2023.csv: 2022.journal
    mkdir -p import/opening
    hledger bal -f $< \
                $(list_of_accounts_I_want_to_carry_over) \
        -O csv -N > $@
import/opening/2023.journal: import/opening/2023.csv rules/opening.rules
    rm -f $(@D)/.latest.2023.csv $@
    hledger import --rules-file rules/opening.rules \
        -f $@ $<
I think this could be golfed into a year-generic rule with a little more work. The nice thing about this approach is that the opening balances for a given year might change, if adjustments are made in prior years. They shouldn't, for real accounts, but very well could for more "virtual" liabilities (including: deciding to write off debts).

run lots of reports

Astapov advocates running lots of reports, and automatically. There's a really obvious advantage of that to me: there's no chance anyone except me will actually interact with HLedger itself. For family finances, I need reports to be able to discuss anything with my wife. Extending my make rules to run reports is trivial. I've gone for HTML reports for the most part, as they're the easiest on the eye. Unfortunately the most useful report to discuss (at least at the moment) would be a list of transactions in a given expense category, and the register/aregister commands did not support HTML as an output format. I submitted my first HLedger patch to add HTML output support to aregister: https://github.com/simonmichael/hledger/pull/2000

addressing the virtual posting problem

I wrote in my original hledger blog post that I had to resort to unbalanced virtual postings in order to record both a liability between my personal cash and family, and to categorise the spend. I still haven't found a nice way around that. But I suspect that having broken the journal out into lots of other journals paves the way to a better solution. The form of a solution I am thinking of is: some scheme whereby the two destination accounts are combined together; perhaps, choose one as a primary and encode the other information in sub-accounts under that. For example, repeating the example from my hledger blog post:
2022-01-02 ZTL*RELISH
    family:liabilities:creditcard        -3.00
    family:dues:jon                       3.00
    (jon:expenses:snacks)                 3.00
This could become
2022-01-02 ZTL*RELISH
    family:liabilities:creditcard        -3.00
    family:liabilities:jon:snacks
(I note this is very similar to a solution proposed to me by someone responding on Twitter.)

The next step is to recognise that sometimes when looking at the data I care about one aspect, and at other times the other, but rarely both. So for the case where I'm thinking about family finances, I could use account aliases to effectively flatten out the expense-category portion and ignore it. On the other hand, when I'm concerned about how I've spent my personal cash and not about how much I owe the family account, I could use aliases to do the opposite: rewrite away the family:liabilities:jon prefix and combine the transactions with the regular jon:expenses account hierarchy. (This is all speculative: I need to actually try it.)

catching errors after an import

When I import the transactions for a given real bank account, I check the final balance against another source, usually a bank statement, to make sure they agree. I wasn't using any of the myriad methods to make sure that this remains true later on, and so there was the risk that I'd make an edit to something and accidentally remove a transaction that contributed to that number, and not notice (until the next import).

The CSV data my bank gives me for accounts (though not for credit cards) also includes a 'resulting balance' field. It was therefore trivial to extend the CSV import rules to add balance assertions to the generated transactions. This catches the problem.

There are a couple of warts with balance assertions on every such transaction: for example, dealing with the duplicate transaction for paying a credit card: one from the bank statement, one from the credit card. Removing one of the two is sufficient to correct the account balances, but sometimes they don't agree on the transaction date, or the transactions within a given day are sorted slightly differently by HLedger than by the bank. The simple solution is to just manually delete one or two assertions: there remain plenty more for assurance.
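The rules-file change for this is small. A sketch, with the field names invented to match a hypothetical bank CSV:

# name the CSV columns; 'bal' is the bank's running-balance column
fields date, description, amount, bal
# assert that balance on the first posting of each generated transaction
balance1 %bal

HLedger then writes each imported transaction with a balance assertion taken from the bank's own figure.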
going forward

I've only scratched the surface of the suggestions in Astapov's "full-fledged hledger" notes: I'm up to step 2 of 14. I'm expecting to return to it once the changes I've made have bedded in a little. I suppose I could anonymise and share the framework (Makefile etc.) that I am using if anyone was interested. It would take some work, though, so I don't know when I'd get around to it.

  1. The "rm .latest" bit is to clear up some state-tracking files that HLedger writes in order to avoid importing duplicate transactions. In this case, I know better than HLedger.
